
    Bell's inequality and the coincidence-time loophole

    This paper analyzes effects of time dependence in the Bell inequality. A generalized inequality is derived for the case in which coincidence and non-coincidence (and hence whether or not a pair contributes to the actual data) are controlled by timing that depends on the detector settings. Needless to say, this inequality is violated by quantum mechanics; it could also be violated by experimental data, provided that the loss of measurement pairs through failure of coincidence is small enough, but the quantitative bound is more restrictive in this case than in the previously analyzed "efficiency loophole".

    Comment: revtex4, 3 figures; v2: epl document class, reformatted with slight changes
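    The mechanism behind the coincidence-time loophole can be illustrated with a toy simulation. The model below is only a hypothetical local hidden-variable model with setting-dependent detection delays, not the construction analyzed in the paper; it shows how a coincidence window selects a settings-dependent subensemble of pairs before the CHSH quantity is computed.

```python
import math
import random

def run(n=20000, window=0.2, seed=1):
    """Simulate a toy local model with setting-dependent delays and
    estimate CHSH on the coincidence-filtered pairs only."""
    rng = random.Random(seed)
    settings_a = [0.0, math.pi / 4]            # Alice's two angles (arbitrary choice)
    settings_b = [math.pi / 8, 3 * math.pi / 8]  # Bob's two angles (arbitrary choice)
    counts = {}        # (i, j) -> (sum of A*B products, number of coincident pairs)
    total = kept = 0
    for _ in range(n):
        lam = rng.uniform(0.0, 2 * math.pi)    # shared hidden variable
        i, j = rng.randrange(2), rng.randrange(2)
        a, b = settings_a[i], settings_b[j]
        A = 1 if math.cos(lam - a) >= 0 else -1   # deterministic local outcomes
        B = 1 if math.cos(lam - b) >= 0 else -1
        # detection times depend on the local setting and the hidden variable
        tA = abs(math.sin(lam - a))
        tB = abs(math.sin(lam - b))
        total += 1
        if abs(tA - tB) < window:              # only coincident pairs enter the data
            kept += 1
            s, c = counts.get((i, j), (0, 0))
            counts[(i, j)] = (s + A * B, c + 1)
    E = {ij: s / c for ij, (s, c) in counts.items()}
    S = abs(E[(0, 0)] + E[(0, 1)] + E[(1, 0)] - E[(1, 1)])
    return S, kept / total

S, frac = run()
print(f"CHSH S on coincident pairs: {S:.3f}, fraction of pairs kept: {frac:.2f}")
```

    The point of the sketch is structural: the kept fraction is below one, and the bound that the coincident data must satisfy for a local model is correspondingly weaker than the ideal CHSH bound of 2, which is what the paper quantifies.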

    Comment on "Exclusion of time in the theorem of Bell" by K. Hess and W. Philipp

    A recent Letter by Hess and Philipp claims that Bell's theorem neglects the possibility of time-like dependence in local hidden variables and hence is not conclusive. Moreover, the authors claim to have constructed, in an earlier paper, a local realistic model of the EPR correlations. However, they themselves have neglected the experimenter's freedom to choose the settings, while, on the other hand, Bell's theorem can be formulated to cope with time-like dependence. This in itself proves that their toy model cannot satisfy local realism, but we also indicate where their proof of its local realistic nature fails.

    Comment: Latex needs epl.cl

    A geometric proof of the Kochen-Specker no-go theorem

    We give a short geometric proof of the Kochen-Specker no-go theorem for non-contextual hidden-variable models. Note added to this version: I understand from Jan-Aake Larsson that the construction given here actually contains the original Kochen-Specker construction, as well as many others (Bell, Conway and Kochen, Schuette, perhaps also Peres).

    Comment: This paper appeared some years ago, before the author was aware of quant-ph. It is relevant to recent developments concerning the Kochen-Specker theorem.

    Fisher information in quantum statistics

    Braunstein and Caves (1994) proposed to use Helstrom's quantum information number to define, meaningfully, a metric on the set of all possible states of a given quantum system. They showed that the quantum information is nothing else than the Fisher information in a measurement of the quantum system, maximized over all possible measurements. Combining this fact with classical statistical results, they argued that the quantum information determines the asymptotically optimal rate at which neighbouring states on some smooth curve can be distinguished, based on arbitrary measurements on n identical copies of the given quantum system. We show that the measurement which maximizes the Fisher information typically depends on the true, unknown, state of the quantum system. We close the resulting loophole in the argument by showing that one can still achieve the same, optimal, rate of distinguishability by a two-stage adaptive measurement procedure. When we consider states lying not on a smooth curve but on a manifold of higher dimension, the situation becomes much more complex. We show that the notion of "distinguishability of close-by states" depends strongly on the measurement resources one allows oneself, and on a further specification of the task at hand. The quantum information matrix no longer seems to play a central role.

    Comment: This version replaces the previous versions of February 1999 (titled 'An Example of Non-Attainability of Expected Quantum Information') and that of November 1999. Proofs and results are much improved. To appear in J. Phys.
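    The attainability issue can already be seen for a single qubit. The sketch below uses an assumed mixed-state family rho(theta) = (I + r(sin theta, 0, cos theta).sigma)/2 with r = 0.9 (an illustration, not an example from the paper): the quantum Fisher information for this family is r^2, but the classical Fisher information of a fixed sigma_z measurement attains it only at theta = pi/2, so the maximizing measurement depends on the unknown true state.

```python
import math

r = 0.9          # Bloch-vector length (assumed, for illustration)
F_Q = r * r      # quantum Fisher information for this rotation family

def fisher_z(theta):
    """Classical Fisher information of a sigma_z measurement on
    rho(theta) = (I + r(sin theta, 0, cos theta).sigma)/2."""
    p = (1 + r * math.cos(theta)) / 2     # probability of outcome +1
    dp = -r * math.sin(theta) / 2         # derivative of p w.r.t. theta
    return dp * dp * (1 / p + 1 / (1 - p))

for theta in (0.0, math.pi / 4, math.pi / 2):
    print(f"theta={theta:.3f}  FI(sigma_z)={fisher_z(theta):.4f}  QFI={F_Q:.4f}")
```

    At theta = 0 the fixed measurement carries no information at all, which is exactly the gap that a two-stage adaptive procedure (rough estimate first, then a measurement tuned to that estimate) is designed to close.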

    Mechanism of the photovoltaic effect in 2-6 compounds: progress report, 1 Oct. 1968 - 31 Mar. 1969

    Heat treatment, illumination and darkness effects, and photovoltaic properties of the Cu2S-CdS heterojunction

    An invitation to quantum tomography (II)

    The quantum state of a light beam can be represented as an infinite-dimensional density matrix, or equivalently as a density on the plane called the Wigner function. We describe quantum tomography as an inverse statistical problem in which the state is the unknown parameter and the data are given by results of measurements performed on identical quantum systems. We present consistency results for Pattern Function Projection Estimators as well as for Sieve Maximum Likelihood Estimators, for both the density matrix of the quantum state and its Wigner function. An EM algorithm is proposed for practical implementation. Finally, we illustrate the performance of the estimators via simulated data. There remain many open problems, e.g. rates of convergence, adaptation, and the study of other estimators, and a main purpose of the paper is to bring these to the attention of the statistical community.

    Comment: An earlier version of this paper, with more mathematical background but less applied statistical content, can be found on arXiv as quant-ph/0303020. An electronic version of the paper with high-resolution figures (postscript instead of bitmaps) is available from the authors. v2: added cross-validation results, reference
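    An EM-style iteration for maximum-likelihood density-matrix estimation can be sketched in a few lines. The version below is a generic fixed-point iteration of the form rho -> R rho R (normalised), run on a qubit with a six-outcome Pauli POVM and noiseless frequencies; the design, the true state, and the iteration count are assumptions for illustration, not the measurement scheme or algorithm details of the paper.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Six effects: +/- eigenprojectors of sigma_x, sigma_y, sigma_z, with each
# basis chosen with probability 1/3 (a toy informationally complete POVM).
projs = [(I2 + sgn * s) / 2 for s in (sx, sy, sz) for sgn in (+1, -1)]
projs = [P / 3 for P in projs]          # fold the 1/3 basis choice into the POVM

rho_true = 0.5 * (I2 + 0.6 * sx + 0.3 * sz)                     # "unknown" state
freqs = np.array([np.trace(P @ rho_true).real for P in projs])  # ideal (noiseless) data

rho = I2 / 2                            # start from the maximally mixed state
for _ in range(200):                    # R rho R fixed-point (EM-like) iteration
    probs = np.array([np.trace(P @ rho).real for P in projs])
    R = sum(f / p * P for f, p, P in zip(freqs, probs, projs))
    rho = R @ rho @ R
    rho /= np.trace(rho).real           # renormalise to unit trace

err = np.abs(rho - rho_true).max()
print(f"max |rho_hat - rho_true| after 200 iterations: {err:.2e}")
```

    With noisy finite-sample frequencies the same iteration returns the maximum-likelihood estimate rather than the true state, which is where the consistency questions studied in the paper arise.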

    Estimation in a growth study with irregular measurement times

    Between 1982 and 1988 a growth study was carried out at the Division of Pediatric Oncology of the University Hospital of Groningen. Special features of the project were that sample sizes were small and that ages at entry could be very different. In addition, the intended design was not fully complied with. This paper highlights some aspects of the statistical analysis, which is based on (1) reference scores and (2) statistical procedures allowing for an irregular pattern of measurement times caused by missing data and shifted measurement times.

    Anna Karenina and The Two Envelopes Problem

    The Anna Karenina principle is named after the opening sentence of the eponymous novel: "Happy families are all alike; every unhappy family is unhappy in its own way." The Two Envelopes Problem (TEP) is a much-studied paradox in probability theory, mathematical economics, logic, and philosophy. Time and again a new analysis is published in which an author claims finally to explain what actually goes wrong in this paradox. Each author (the present author included) emphasizes what is new in their approach and concludes that earlier approaches did not get to the root of the matter. We observe that though a logical argument is only correct if every step is correct, an apparently logical argument which goes astray can be thought of as going astray at different places. This leads to a comparison between the literature on TEP and a successful movie franchise: it generates a succession of sequels, and even prequels, each with a different director who approaches the same basic premise in a personal way. We survey resolutions in the literature with a view to synthesis, correct common errors, and give a new theorem on order properties of an exchangeable pair of random variables, which lies at the heart of most TEP variants and interpretations. A theorem on asymptotic independence between the amount in your envelope and the question of whether it is smaller or larger shows that the pathological situation of improper priors or infinite expectation values already has consequences as we merely approach such a situation.

    Comment: Final corrections (fingers crossed)
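    The role of a proper prior with finite expectation is easy to check by simulation. The sketch below (with an assumed exponential distribution for the smaller amount; any proper, finite-mean prior would behave the same) shows that always swapping yields no expected gain, contrary to the naive "the other envelope is worth 25% more" argument that drives the paradox.

```python
import random

rng = random.Random(0)
n = 200_000
gain = 0.0
for _ in range(n):
    x = rng.expovariate(1.0)      # smaller amount: proper prior with finite mean
    pair = (x, 2 * x)             # the two envelopes contain x and 2x
    k = rng.randrange(2)          # you pick one envelope uniformly at random
    yours, other = pair[k], pair[1 - k]
    gain += other - yours         # what always swapping would change
print(f"average gain from always swapping: {gain / n:+.4f}")
```

    By symmetry the expected gain is exactly zero; the paradoxical "conditional expectation is always larger" reasoning can only be sustained with improper priors or infinite expectation values, which is where the paper's asymptotic-independence theorem takes over.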